differentiable representation
Adaptive Neural Compilation
This paper proposes an adaptive neural-compilation framework to address the problem of learning efficient programs. Traditional code optimisation strategies used in compilers are based on applying a pre-specified set of transformations that make the code faster to execute without changing its semantics. In contrast, our work involves adapting programs to make them more efficient while considering correctness only on a target input distribution. Our approach is inspired by recent work on differentiable representations of programs. We show that it is possible to compile programs written in a low-level language to a differentiable representation. We also show how programs in this representation can be optimised to make them efficient on a target distribution of inputs. Experimental results demonstrate that our approach enables learning specifically-tuned algorithms for given data distributions with a high success rate.
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.14)
- Europe > Spain > Catalonia > Barcelona Province > Barcelona (0.04)
- North America > United States > Pennsylvania > Allegheny County > Pittsburgh (0.04)
- North America > United States > California (0.04)
- Research Report > Experimental Study (0.93)
- Workflow (0.67)
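The core idea of the entry above, a differentiable program representation trained toward correctness on a target input distribution, can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's machine model: instructions are linear maps over a two-register file, the choice of instruction per program line is a softmax over opcodes, and execution blends every opcode's result by its soft weight.

```python
# Minimal sketch of a differentiable program representation, in the spirit of
# "Adaptive Neural Compilation". The opcodes, register file, and loss are
# illustrative assumptions, not the paper's actual formulation.
import torch

K, T = 2, 3                     # 2 registers, 3 program lines

# Each "opcode" is a differentiable linear transition on the register file.
OPS = torch.stack([
    torch.eye(K),                           # NOOP
    torch.tensor([[1., 1.], [0., 1.]]),     # r0 <- r0 + r1
    torch.tensor([[1., 0.], [1., 0.]]),     # r1 <- r0
])

logits = torch.zeros(T, len(OPS), requires_grad=True)   # the soft program
opt = torch.optim.Adam([logits], lr=0.1)

for step in range(500):
    x = torch.rand(64, 1) * 10          # samples from the target input distribution
    regs = torch.cat([x, x], dim=1)     # initialise r0 = x, r1 = x
    for t in range(T):                  # soft execution of each program line
        w = torch.softmax(logits[t], dim=0)
        # blend the result of every opcode, weighted by the soft choice
        regs = sum(w[i] * regs @ OPS[i].T for i in range(len(OPS)))
    loss = ((regs[:, 0] - 3 * x[:, 0]) ** 2).mean()     # want r0 == 3x
    opt.zero_grad()
    loss.backward()
    opt.step()

print(torch.softmax(logits, dim=1))     # near one-hot rows -> a discrete program
```

After training, the softmax rows typically collapse toward one-hot choices, at which point the soft program can be read off as a discrete one.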
Differentiable Economics: Strategic Behavior, Mechanisms, and Machine Learning
Economists have developed different types of models describing the interaction of agents in markets. Early models in general equilibrium theory describe agents taking prices as given and do not consider the incentives of agents to manipulate prices strategically. With appropriate convexity assumptions on the preferences, such models can be cast as convex optimization problems for which efficient algorithms are known to find a competitive equilibrium. Price-taking behavior might be a reasonable approximation of agent behavior in large markets, but it does not adequately capture the incentives and strategies that agents have in smaller markets or in other strategic settings. Modern models in economics, such as those used for modeling auctions, oligopoly competition, or contests, are based on game theory, with the Nash equilibrium as the central solution concept.
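The reduction of price-taking models to convex optimization mentioned above can be made concrete with the Eisenberg-Gale program for a linear Fisher market: maximizing budget-weighted log utilities over feasible allocations yields a competitive equilibrium, with the supply constraints' dual variables acting as the market-clearing prices. The market data below is made up for illustration.

```python
# Competitive equilibrium of a linear Fisher market via the Eisenberg-Gale
# convex program (a standard construction; the data here is illustrative).
import cvxpy as cp
import numpy as np

V = np.array([[2.0, 1.0],      # valuations: agent i's value per unit of good j
              [1.0, 3.0]])
b = np.array([1.0, 1.0])       # agent budgets

X = cp.Variable(V.shape, nonneg=True)       # allocation of goods to agents
u = cp.sum(cp.multiply(V, X), axis=1)       # linear utilities
supply = cp.sum(X, axis=0) <= 1             # one unit of each good

# Eisenberg-Gale objective: budget-weighted log utilities.
prob = cp.Problem(cp.Maximize(cp.sum(cp.multiply(b, cp.log(u)))), [supply])
prob.solve()

print("allocation:\n", X.value.round(3))
print("prices:", supply.dual_value.round(3))  # duals of supply = prices
```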
Diffusing Differentiable Representations
Savani, Yash, Finzi, Marc, Kolter, J. Zico
We introduce a novel, training-free method for sampling differentiable representations (diffreps) using pretrained diffusion models. Rather than merely mode-seeking, our method achieves sampling by "pulling back" the dynamics of the reverse-time process--from the image space to the diffrep parameter space--and updating the parameters according to this pulled-back process. We identify an implicit constraint on the samples induced by the diffrep and demonstrate that addressing this constraint significantly improves the consistency and detail of the generated objects. Our method yields diffreps with substantially improved quality and diversity for images, panoramas, and 3D NeRFs compared to existing techniques. Our approach is a general-purpose method for sampling diffreps, expanding the scope of problems that diffusion models can tackle.
- North America > United States > Pennsylvania > Allegheny County > Pittsburgh (0.04)
- North America > United States > California (0.04)
- Research Report (0.84)
- Workflow (0.67)
- Information Technology > Artificial Intelligence > Vision (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning (1.00)
- Information Technology > Sensing and Signal Processing > Image Processing (0.88)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.68)
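A heavily simplified sketch of the "pulling back" idea from the entry above: render an image from the diffrep parameters, take a reverse-time step proposed in image space, and project that step back onto the parameters by least squares. The low-rank Diffrep, the stub denoiser_update, and the inner projection loop are all placeholder assumptions, not the paper's method.

```python
# Sketch of pulling a reverse-diffusion update from image space back to the
# parameters of a differentiable representation (diffrep). The denoiser below
# is a stub standing in for a pretrained diffusion model, and the projection
# scheme is a generic least-squares pullback, not the paper's exact dynamics.
import torch

class Diffrep(torch.nn.Module):
    """Toy diffrep: a low-rank parameterization of a 32x32 image."""
    def __init__(self, rank=4, size=32):
        super().__init__()
        self.U = torch.nn.Parameter(torch.randn(size, rank) * 0.1)
        self.Vt = torch.nn.Parameter(torch.randn(rank, size) * 0.1)

    def forward(self):
        return self.U @ self.Vt            # rendered image x = f(theta)

def denoiser_update(x, t):
    """Stub for a pretrained model's reverse-step increment at time t."""
    return -0.05 * x + 0.01 * torch.randn_like(x)   # placeholder dynamics

rep = Diffrep()
opt = torch.optim.SGD(rep.parameters(), lr=0.5)

for t in reversed(range(50)):              # reverse-time loop
    with torch.no_grad():
        target = rep() + denoiser_update(rep(), t)  # image-space step
    for _ in range(5):                     # pull the step back to theta by
        loss = ((rep() - target) ** 2).mean()       # least-squares projection
        opt.zero_grad()
        loss.backward()
        opt.step()
```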
Computing distances and means on manifolds with a metric-constrained Eikonal approach
Computing distances on Riemannian manifolds is a challenging problem with numerous applications, from physics, through statistics, to machine learning. In this paper, we introduce the metric-constrained Eikonal solver to obtain continuous, differentiable representations of distance functions (geodesics) on manifolds. The differentiable nature of these representations allows for the direct computation of globally length-minimising paths on the manifold. We showcase the use of metric-constrained Eikonal solvers for a range of manifolds and demonstrate their applications. First, we demonstrate that metric-constrained Eikonal solvers can be used to obtain the Fréchet mean on a manifold, employing the definition of a Gaussian mixture model, which has an analytical solution to verify the numerical results. Second, we demonstrate how the obtained distance function can be used to conduct unsupervised clustering on the manifold - a task for which existing approaches are computationally prohibitive. This work opens opportunities for distance computations on manifolds.
- North America > United States > New York > New York County > New York City (0.14)
- Europe > United Kingdom > England > Greater London > London (0.04)
- North America > United States > Massachusetts > Suffolk County > Boston (0.04)
- (3 more...)
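The entry above fits a differentiable distance field satisfying the Riemannian Eikonal equation, which for an inverse metric G^{-1} reads sqrt(grad d^T G^{-1} grad d) = 1 with d = 0 at the source. A minimal sketch with a neural field and a PDE-residual loss follows; the conformal metric, network size, and loss weights are illustrative assumptions, not the paper's configuration.

```python
# Sketch of a metric-constrained Eikonal solver: fit a neural distance field
# d(x) so that the Riemannian Eikonal residual ||grad d||_{G^{-1}} = 1 holds,
# with d(x0) = 0 at the source point.
import torch

net = torch.nn.Sequential(                 # d_phi : R^2 -> R
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x0 = torch.zeros(1, 2)                     # geodesic source point

def metric_inv(x):
    """Inverse metric G^{-1}(x); a simple conformal metric as a stand-in."""
    return 1.0 / (1.0 + (x ** 2).sum(dim=1, keepdim=True))  # G^{-1} = s * I

for step in range(2000):
    x = (torch.rand(256, 2) * 4 - 2).requires_grad_(True)
    d = net(x)
    grad = torch.autograd.grad(d.sum(), x, create_graph=True)[0]
    sq_norm = metric_inv(x) * (grad ** 2).sum(dim=1, keepdim=True)
    eikonal = ((sq_norm.sqrt() - 1.0) ** 2).mean()   # PDE residual
    boundary = net(x0).pow(2).mean()                  # enforce d(x0) = 0
    loss = eikonal + 10.0 * boundary
    opt.zero_grad()
    loss.backward()
    opt.step()
```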